Less Regret via Online Conditioning

Authors

  • Matthew J. Streeter
  • H. Brendan McMahan
Abstract

We analyze and evaluate an online gradient descent algorithm with adaptive per-coordinate adjustment of learning rates. Our algorithm can be thought of as an online version of batch gradient descent with a diagonal preconditioner. This approach leads to regret bounds that are stronger than those of standard online gradient descent for general online convex optimization problems. Experimentally, we show that our algorithm is competitive with state-of-the-art algorithms for large scale machine learning problems.
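The abstract describes per-coordinate adjustment of learning rates, i.e., an online analogue of a diagonal preconditioner. A minimal sketch of this idea in Python (AdaGrad-style accumulation of squared gradients; the function name, `grad_fn` interface, and the quadratic example are illustrative, not the authors' exact algorithm):

```python
import numpy as np

def per_coordinate_ogd(grad_fn, x0, steps, eta=1.0, eps=1e-8):
    """Online gradient descent with a per-coordinate adaptive learning rate.

    Each coordinate i accumulates the sum of its squared gradients G[i];
    its effective step size then shrinks like eta / sqrt(G[i]), which is
    equivalent to preconditioning each step with diag(G)^(-1/2).
    """
    x = np.asarray(x0, dtype=float).copy()
    G = np.zeros_like(x)  # running sum of squared gradients, per coordinate
    for _ in range(steps):
        g = grad_fn(x)
        G += g * g
        x -= eta * g / (np.sqrt(G) + eps)  # coordinate-wise step
    return x

# Illustrative online quadratic loss f_t(x) = 0.5 * ||x - target||^2
target = np.array([3.0, -1.0])
x_final = per_coordinate_ogd(lambda x: x - target, np.zeros(2), steps=500)
```

Coordinates that see large gradients get small effective steps, while rarely updated coordinates keep large ones, which is what makes this kind of adaptivity attractive for sparse, large-scale problems.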


Related articles

Affine-Invariant Online Optimization and the Low-rank Experts Problem

We present a new affine-invariant optimization algorithm called Online Lazy Newton. The regret of Online Lazy Newton is independent of conditioning: the algorithm's performance depends on the best possible preconditioning of the problem in retrospect and on its intrinsic dimensionality. As an application, we show how Online Lazy Newton can be used to achieve an optimal regret of order √(rT) for t...


Understanding online regret experience using the theoretical lens of flow experience

Recent research has emphasized the exponential increase in the online regret experience among online users. Such experience results in poor satisfaction, brand switching, and even service discontinuity. However, little prior research has investigated the relative influence of online platform characteristics and individual differences (such as demographics) in predicting the online regret experi...


Sensitive Error Correcting Output Codes

We present a reduction from cost-sensitive classification to binary classification based on (a modification of) error correcting output codes. The reduction satisfies the property that regret for binary classification implies l2-regret of at most 2 for cost-estimation. This has several implications: 1) Any regret-minimizing online algorithm for 0/1 loss is (via the reduction) a regret-minimizing onl...


Sensitive Error Correcting Output Codes

Sensitive error correcting output codes are a reduction from cost-sensitive classification to binary classification. They are a modification of error correcting output codes [3] which satisfy an additional property: regret for binary classification implies at most 2 l2 regret for cost-estimation. This has several implications: 1) Any 0/1 regret minimizing online algorithm is (via the reduction) a r...


Exploring the relationship between cognitive effort exertion and regret in online vs. offline shopping

Decision making is a fundamental building block of people’s lives. Each decision requires expenditure of cognitive effort, though to a varying degree, which is considered a valuable yet limited resource in the decision making literature. Though the importance of a cognitive effort minimization goal is well-established in the marketing literature, this paper examined how cognitive effort exertio...



Journal:
  • CoRR

Volume abs/1002.4862  Issue

Pages  -

Publication date 2010